Layer-wise relevance propagation for interpreting LSTM-RNN decisions in predictive maintenance
Authors
Abstract
Predictive maintenance (PdM) is an advanced technique to predict the time to failure (TTF) of a system. PdM collects sensor data on system health, processes the information using analytics, and then establishes data-driven models that can forecast system failure. Deep neural networks are increasingly being used as these models owing to their high predictive accuracy and efficiency. However, deep networks are often criticized as "black boxes," whose multi-layered non-linear structure provides little insight into the underlying physics of the monitored system and renders the predictions nontransparent and untraceable. To address this issue, layer-wise relevance propagation (LRP) is applied to analyze a long short-term memory (LSTM) recurrent neural network (RNN) model. The proposed method is demonstrated and validated through a bearing monitoring case study based on vibration data. The obtained LRP results provide insights into how the model "learns" from the input and demonstrate the distribution of contribution/relevance in the classification space. In addition, comparisons are made with gradient-based sensitivity analysis to show the power of LRP in interpreting RNN models. The method is shown to have promising potential for improving the efficiency of PdM.
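For readers unfamiliar with LRP, the sketch below is a minimal illustration (not taken from the paper; the function name lrp_epsilon and the arrays a, w, b, eps are hypothetical) of the commonly used epsilon-rule for a single fully connected layer: the relevance assigned to the layer's outputs is redistributed onto its inputs in proportion to each input's contribution a_j * w_jk to the pre-activation z_k, and this backward pass is repeated layer by layer until relevance scores are obtained for the raw inputs.

    import numpy as np

    def lrp_epsilon(a, w, b, relevance_out, eps=1e-6):
        # a: input activations, shape (n_in,)
        # w: weight matrix, shape (n_in, n_out); b: bias vector, shape (n_out,)
        # relevance_out: relevance assigned to the layer's outputs, shape (n_out,)
        z = a @ w + b                               # pre-activations z_k
        z = z + eps * np.where(z >= 0, 1.0, -1.0)   # epsilon stabilizer avoids division by ~0
        s = relevance_out / z                       # relevance per unit of pre-activation
        return a * (w @ s)                          # R_j = a_j * sum_k w_jk * s_k

Starting from the score of the predicted class at the output layer and applying such rules backwards yields a relevance value for every input feature; for an LSTM, its multiplicative gates additionally require a dedicated rule, and LSTM-specific LRP variants commonly pass the relevance at a gate entirely to the signal input rather than to the gate itself.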
Similar resources
On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
Understanding and interpreting classification decisions of automated image classification systems is of high value in many applications, as it allows one to verify the reasoning of the system and provides additional information to the human expert. Although machine learning methods are solving a plethora of tasks very successfully, they have in most cases the disadvantage of acting as a black box, ...
Interpreting the Predictions of Complex ML Models by Layer-wise Relevance Propagation
Complex nonlinear models such as deep neural networks (DNNs) have become an important tool for image classification, speech recognition, natural language processing, and many other fields of application. These models however lack transparency due to their complex nonlinear structure and to the complex data distributions to which they typically apply. As a result, it is difficult to fully charact...
Layer-wise Relevance Propagation for Deep Neural Network Architectures
We present the application of layer-wise relevance propagation to several deep neural networks such as the BVLC reference neural net and GoogLeNet trained on ImageNet and MIT Places datasets. Layer-wise relevance propagation is a method to compute scores for image pixels and image regions denoting the impact of the particular image region on the prediction of the classifier for one particular te...
Layer-Wise Relevance Propagation for Neural Networks with Local Renormalization Layers
Layer-wise relevance propagation is a framework which allows one to decompose the prediction of a deep neural network computed over a sample, e.g. an image, down to relevance scores for the single input dimensions of the sample, such as subpixels of an image. While this approach can be applied directly to generalized linear mappings, product type non-linearities are not covered. This paper proposes ...
Beyond saliency: understanding convolutional neural networks from saliency prediction on layer-wise relevance propagation
Despite the tremendous achievements of deep convolutional neural networks (CNNs) in most computer vision tasks, understanding how they actually work remains a significant challenge. In this paper, we propose a novel two-step visualization method that aims to shed light on how deep CNNs recognize images and the objects therein. We start out with a layer-wise relevance propagation (LRP) step w...
Journal
Journal title: The International Journal of Advanced Manufacturing Technology
Year: 2021
ISSN: 1433-3015, 0268-3768
DOI: https://doi.org/10.1007/s00170-021-07911-9